Wasserstein Identity Testing
Authors
Abstract
Uniformity testing and the more general identity testing are well-studied problems in distributional property testing. Most previous work focuses on testing under the L1-distance. However, when the support is very large or even continuous, testing under the L1-distance may require a huge (even infinite) number of samples. Motivated by such issues, we consider identity testing in Wasserstein distance (a.k.a. transportation distance or earth mover's distance) on a metric space (discrete or continuous). In this paper, we propose the Wasserstein identity testing problem (identity testing in Wasserstein distance). We obtain nearly optimal worst-case sample complexity for the problem. Moreover, for a large class of probability distributions satisfying the so-called "Doubling Condition", we provide nearly instance-optimal sample complexity.
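To make the distance in question concrete: on the real line, the Wasserstein-1 distance between two distributions reduces to the integral of the absolute difference of their CDFs, which unlike the L1 (total variation) distance accounts for how far mass must move, not just how much. The sketch below (an illustration of the metric itself, not of the paper's testing algorithm) computes it for two discrete distributions on a common sorted support; the function name and inputs are my own choices for the example.

```python
def wasserstein_1d(support, p, q):
    """Wasserstein-1 distance between two probability distributions p and q
    given on a common, sorted 1-D support: the integral of |F_p(x) - F_q(x)|,
    where F_p and F_q are the respective CDFs."""
    dist = 0.0
    cdf_p = 0.0
    cdf_q = 0.0
    # Between consecutive support points the CDFs are constant, so the
    # integral is a sum of |CDF gap| * interval length terms.
    for i in range(len(support) - 1):
        cdf_p += p[i]
        cdf_q += q[i]
        dist += abs(cdf_p - cdf_q) * (support[i + 1] - support[i])
    return dist

# All mass at 0 vs. all mass at 2: the mass travels distance 2,
# even though the L1 (total variation) distance is already maximal (1).
print(wasserstein_1d([0.0, 1.0, 2.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]))  # 2.0
```

This illustrates why testing in Wasserstein distance can be far cheaper on large or continuous supports: two distributions that are nearly identical geometrically (mass shifted slightly) have small Wasserstein distance even when their L1 distance is large.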
Similar resources
Coordinate-wise Transformation of Probability Distributions to Achieve a Stein-type Identity
It is shown that for any given multi-dimensional probability distribution, there exists a unique coordinate-wise transformation such that the transformed distribution satisfies a Stein-type identity. The proof is based on an energy minimization problem over a subset of the Wasserstein space. The result is interpreted as a generalization of the diagonal scaling theorem established by Marshall an...
On Wasserstein Two-Sample Testing and Related Families of Nonparametric Tests
Nonparametric two-sample or homogeneity testing is a decision-theoretic problem that involves identifying differences between two random variables without making parametric assumptions about their underlying distributions. The literature is old and rich, with a wide variety of statistics having been designed and analyzed, both for the unidimensional and the multivariate setting. In this short ...
A note on reinforcement learning with Wasserstein distance regularisation, with applications to multipolicy learning
In this note we describe an application of Wasserstein distance to Reinforcement Learning. The Wasserstein distance in question is between the distribution of mappings of trajectories of a policy into some metric space, and some other fixed distribution (which may, for example, come from another policy). Different policies induce different distributions, so given an underlying metric, the Wasse...
The Cramer Distance as a Solution to Biased Wasserstein Gradients
The Wasserstein probability metric has received much attention from the machine learning community. Unlike the Kullback-Leibler divergence, which strictly measures change in probability, the Wasserstein metric reflects the underlying geometry between outcomes. The value of being sensitive to this geometry has been demonstrated, among others, in ordinal regression and generative modelling. In th...
Wasserstein CNN: Learning Invariant Features for NIR-VIS Face Recognition
Heterogeneous face recognition (HFR) aims to match facial images acquired from different sensing modalities with mission-critical applications in forensics, security and commercial sectors. However, HFR is a much more challenging problem than traditional face recognition because of large intra-class variations of heterogeneous face images and limited training samples of cross-modality face imag...
Journal: CoRR
Volume: abs/1710.10457
Pages: -
Publication date: 2017